# Multilingual long text processing
## Nomic Xlm 2048

A fine-tuned version of the XLM-RoBERTa base model that replaces the original positional embeddings with RoPE (Rotary Position Embedding), extending the supported sequence length to 2048.

- Category: Large Language Model
- Framework: Transformers

- Author: nomic-ai
- Downloads: 440
- Likes: 6
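For orientation, a minimal loading sketch, assuming the model is published on the Hugging Face Hub as `nomic-ai/nomic-xlm-2048` (an id inferred from the author and model names above, not confirmed by the listing) and that the RoPE variant ships custom modeling code, hence `trust_remote_code=True`:

```python
# Hedged sketch: the Hub id and trust_remote_code requirement are assumptions.
from transformers import AutoModel, AutoTokenizer

model_id = "nomic-ai/nomic-xlm-2048"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

# Tokenize up to the 2048-token limit stated in the description.
inputs = tokenizer(
    "A long multilingual document ...",
    truncation=True,
    max_length=2048,
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

Because RoPE encodes positions as rotations inside attention rather than through a learned embedding table, the 2048 limit reflects the training configuration rather than a fixed embedding size.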
## Xlm Roberta Longformer Base 16384

A multilingual Longformer model initialized with XLM-RoBERTa weights, supporting a context length of 16,384 and suitable for fine-tuning on downstream tasks.

- License: MIT
- Category: Large Language Model
- Framework: Transformers

- Author: severinsimmler
- Downloads: 5,438
- Likes: 26
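As above, a minimal sketch for attaching a classification head for downstream fine-tuning, assuming the Hub id `severinsimmler/xlm-roberta-longformer-base-16384` (inferred from the author and model names above); `num_labels=2` is purely illustrative:

```python
# Hedged sketch: the Hub id and label count are assumptions, not confirmed
# by the listing above.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "severinsimmler/xlm-roberta-longformer-base-16384"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Longformer-style sparse attention handles long inputs; cap at the
# stated 16,384-token context length.
inputs = tokenizer(
    "A very long multilingual document ...",
    truncation=True,
    max_length=16384,
    return_tensors="pt",
)
logits = model(**inputs).logits  # shape: (batch, num_labels)
print(logits.shape)
```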